Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy

Author

  • Robert K. Niven
Abstract

This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation of the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribution subject to the Stirling approximation; (ii) an explanation for the need to maximize entropy (or minimize cross-entropy) to find the most probable realization of a system; and (iii) new, generalized definitions of entropy and cross-entropy, supersets of the Boltzmann principle, applicable to non-multinomial systems. The combinatorial basis is therefore of much broader scope, with far greater power of application, than the information-theoretic and axiomatic bases. The generalized definitions underpin a new discipline of "combinatorial information theory", for the analysis of probabilistic systems of any type. Jaynes' generic formulation of statistical mechanics for multinomial systems is re-examined in light of the combinatorial approach, including the analysis of probability distributions, ensemble theory, Jaynes relations, fluctuation theory and the entropy concentration theorem. Several new concepts are outlined, including a generalized Clausius inequality, a generalized free energy ("free information") function, and a generalized Gibbs-Duhem relation and phase rule. For non-multinomial systems, the generalized approach provides a different framework for the reinterpretation of the many alternative entropy measures (e.g. Bose-Einstein, Fermi-Dirac, Rényi, Tsallis, Sharma-Mittal, Beck-Cohen, Kaniadakis) in terms of their combinatorial structure. A connection between the combinatorial and Bayesian approaches is also explored.

PACS numbers: 02.50.Cw, 02.50.Tt, 05.20.-y, 05.40.-a, 05.70.-a, 05.70.Ce, 05.90.+m, 89.20.-a, 89.70.+c
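To make the Stirling-approximation argument above concrete, the following is a minimal numerical sketch (not taken from the paper; the distributions q and p, the sample sizes, and the use of numpy/scipy are illustrative assumptions). It checks that the scaled negative log-probability of a multinomial realization, -(1/N) ln P(n|q), converges to the Kullback-Leibler cross-entropy D(p||q) as N grows; with a uniform prior q this reduces, up to the additive constant ln s, to the negative Shannon entropy.

    # Hedged sketch: compares -(1/N) ln P(n | q) for a multinomial realization
    # with the Kullback-Leibler cross-entropy D(p || q); the two agree as N -> oo,
    # which is the Stirling-approximation step behind the combinatorial basis.
    # q, p and the sample sizes below are illustrative choices, not from the paper.
    import numpy as np
    from scipy.stats import multinomial

    q = np.array([0.5, 0.3, 0.2])   # assumed source ("prior") probabilities
    p = np.array([0.4, 0.4, 0.2])   # observed frequencies p_i = n_i / N

    for N in (10, 100, 1000, 10000):
        n = np.rint(p * N).astype(int)
        n[-1] = N - n[:-1].sum()                  # force the counts to sum to N exactly
        lhs = -multinomial.logpmf(n, N, q) / N    # -(1/N) ln P(n | q)
        rhs = float(np.sum(p * np.log(p / q)))    # Kullback-Leibler cross-entropy D(p || q)
        print(N, lhs, rhs)                        # the two values converge as N grows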


Similar sources

Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy (arXiv preprint, Apr 2007)

This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation for the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribu...

Full text

Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy (arXiv:cond-mat/0512017v2, 9 Jan 2006)

The three main theoretical bases of the concepts of entropy and cross-entropy (information-theoretic, axiomatic and combinatorial) are critically examined. It is shown that the combinatorial basis, proposed by Boltzmann and Planck, is the most fundamental (most primitive) basis of these concepts, since it provides (i) a derivation of the Kullback-Leibler cross-entropy and Shannon entropy functions...

Full text

Evaluation of monitoring network density using discrete entropy theory

The regional evaluation of monitoring stations for water resources can be of great importance due to its role in finding appropriate locations for stations, maximizing the collection of useful information, preventing the accumulation of unnecessary information and, ultimately, reducing the cost of data collection. Based on the theory of discrete entropy, this study analyzes the density of rain gag...

Full text

Regional Evaluation of Hydrometric Monitoring Stations through Using Entropy Theory

Proper design and operation of monitoring systems for water resources management is one of the most important issues of water quality and quantity and accuracy and adequacy of data. The proper evaluation of these data has a determining role in correct and consistent decisions in the area covered by the system. Therefore, determining the proper distribution and number of monitoring network stations a...

Full text

Origins of the Combinatorial Basis of Entropy

The combinatorial basis of entropy, given by Boltzmann, can be written H = N^{-1} ln W, where H is the dimensionless entropy, N is the number of entities and W is the number of ways in which a given realization of a system can occur (its statistical weight). This can be broadened to give generalized combinatorial (or probabilistic) definitions of entropy and cross-entropy: H = κ(φ(W) +C) and D = −κ(φ(P...

Full text
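As a brief illustration of the Boltzmann form quoted in the entry above (a worked step added here, not part of that entry): for a multinomial system the statistical weight is W = N!/(n_1! ... n_s!), and applying the Stirling approximation ln x! ≈ x ln x − x, with Σ_i n_i = N and p_i = n_i/N (the −N and +Σ_i n_i terms cancel), recovers the Shannon form:

    \[
    H \;=\; \frac{1}{N}\ln W
      \;=\; \frac{1}{N}\ln\frac{N!}{\prod_i n_i!}
      \;\approx\; \frac{1}{N}\Bigl(N\ln N - \sum_i n_i\ln n_i\Bigr)
      \;=\; -\sum_i \frac{n_i}{N}\ln\frac{n_i}{N}
      \;=\; -\sum_i p_i \ln p_i .
    \]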


Journal title:
  • CoRR

Volume: abs/cond-mat/0512017   Issue:

Pages: -

Publication date: 2005